
Collaborating Authors: signature kernel



Random Controlled Differential Equations

Piatti, Francesco, Cass, Thomas, Turner, William F.

arXiv.org Machine Learning

We introduce a training-efficient framework for time-series learning that combines random features with controlled differential equations (CDEs). In this approach, large randomly parameterized CDEs act as continuous-time reservoirs, mapping input paths to rich representations. Only a linear readout layer is trained, resulting in fast, scalable models with strong inductive bias. Building on this foundation, we propose two variants: (i) Random Fourier CDEs (RF-CDEs): these lift the input signal using random Fourier features prior to the dynamics, providing a kernel-free approximation of RBF-enhanced sequence models; (ii) Random Rough DEs (R-RDEs): these operate directly on rough-path inputs via a log-ODE discretization, using log-signatures to capture higher-order temporal interactions while remaining stable and efficient. We prove that in the infinite-width limit, these models induce the RBF-lifted signature kernel and the rough signature kernel, respectively, offering a unified perspective on random-feature reservoirs, continuous-time deep architectures, and path-signature theory. We evaluate both models across a range of time-series benchmarks, demonstrating competitive or state-of-the-art performance. These methods provide a practical alternative to explicit signature computations, retaining their inductive bias while benefiting from the efficiency of random features.
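The reservoir idea can be sketched in a few lines of NumPy: random, untrained vector fields drive an Euler-discretized CDE, and only a linear readout is fit. The tanh parameterisation, the sizes, and the ridge readout below are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 64, 3  # reservoir width and input channels (illustrative choices)

# Randomly parameterised, untrained vector fields f_i(h) = tanh(A_i h + b_i)
A = rng.normal(scale=1.0 / np.sqrt(d), size=(m, d, d))
b = rng.normal(size=(m, d))

def reservoir(path):
    """Euler discretisation of dh_t = sum_i f_i(h_t) dX_t^i; path has shape (T, m)."""
    h = np.zeros(d)
    for dx in np.diff(path, axis=0):
        h = h + sum(dx[i] * np.tanh(A[i] @ h + b[i]) for i in range(m))
    return h

# Only the linear readout is trained, e.g. by ridge regression on reservoir states.
paths = [rng.normal(size=(20, m)).cumsum(axis=0) for _ in range(32)]
H = np.stack([reservoir(p) for p in paths])   # (32, d) features
y = rng.normal(size=32)                       # toy targets
w = np.linalg.solve(H.T @ H + 1e-3 * np.eye(d), H.T @ y)
```

Because the dynamics are frozen, training reduces to one linear solve, which is the source of the efficiency claim.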


Signature Kernel Scoring Rule as Spatio-Temporal Diagnostic for Probabilistic Forecasting

Dodson, Archer, Dutta, Ritabrata

arXiv.org Machine Learning

Modern weather forecasting has increasingly transitioned from numerical weather prediction (NWP) to data-driven machine learning forecasting techniques. While these new models produce probabilistic forecasts to quantify uncertainty, their training and evaluation may remain hindered by conventional scoring rules, primarily MSE, which ignore the highly correlated data structures present in weather and atmospheric systems. This work introduces the signature kernel scoring rule, grounded in rough path theory, which reframes weather variables as continuous paths to encode temporal and spatial dependencies through iterated integrals. Validated as strictly proper through the use of path augmentations to guarantee uniqueness, the signature kernel provides a theoretically robust metric for forecast verification and model training. Empirical evaluations through weather scorecards on WeatherBench 2 models demonstrate the signature kernel scoring rule's high discriminative power and unique capacity to capture path-dependent interactions. Following previous demonstration of successful adversarial-free probabilistic training, we train sliding window generative neural networks using a predictive-sequential scoring rule on ERA5 reanalysis weather data. Using a lightweight model, we demonstrate that signature kernel based training outperforms climatology for forecast paths of up to fifteen timesteps.
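A kernel scoring rule of the kind used here has the generic form S_k(P, y) = E[k(X, X')] - 2 E[k(X, y)] for forecast samples X, X' ~ P and observation y. The sketch below uses a cheap stand-in kernel on path increments rather than the actual signature kernel; the estimator structure is the point.

```python
import numpy as np

def kernel_score(samples, obs, k):
    """Unbiased kernel score E k(X,X') - 2 E k(X,y); lower means a better forecast."""
    n = len(samples)
    kxx = sum(k(samples[i], samples[j])
              for i in range(n) for j in range(n) if i != j)
    kxy = sum(k(x, obs) for x in samples)
    return kxx / (n * (n - 1)) - 2.0 * kxy / n

def inc_kernel(x, y):
    """Toy kernel on path increments -- a stand-in for the signature kernel."""
    return float(np.exp(-np.sum((np.diff(x) - np.diff(y)) ** 2)))

rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 3, 16))
good = [obs + 0.05 * rng.normal(size=16) for _ in range(50)]   # sharp forecast
bad = [rng.normal(size=16) for _ in range(50)]                 # uninformed forecast
```

With a characteristic kernel (such as the signature kernel with suitable path augmentations), this score is strictly proper, which is what makes it usable both for verification and as a training loss.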


Non-adversarial training of Neural SDEs with signature kernel scores

Neural Information Processing Systems

Neural SDEs are continuous-time generative models for sequential data. State-of-the-art performance for irregular time series generation has been previously obtained by training these models adversarially as GANs.



pySigLib -- Fast Signature-Based Computations on CPU and GPU

Shmelev, Daniil, Salvi, Cristopher

arXiv.org Machine Learning

Signature-based methods have recently gained significant traction in machine learning for sequential data. In particular, signature kernels have emerged as powerful discriminators and training losses for generative models on time-series, notably in quantitative finance. However, existing implementations do not scale to the dataset sizes and sequence lengths encountered in practice. We present pySigLib, a high-performance Python library offering optimised implementations of signatures and signature kernels on CPU and GPU, fully compatible with PyTorch's automatic differentiation. Beyond an efficient software stack for large-scale signature-based computation, we introduce a novel differentiation scheme for signature kernels that delivers accurate gradients at a fraction of the runtime of existing libraries.

arXiv:2509.10613

Expected Signature Kernels for Lévy Rough Paths

Friz, Peter K., Hager, Paul P.

arXiv.org Machine Learning

The expected signature kernel arises in statistical learning tasks as a similarity measure of probability measures on path space. Computing this kernel for known classes of stochastic processes is an important problem that, in particular, can help reduce computational costs. Building on the representation of the expected signature of (inhomogeneous) Lévy processes with absolutely continuous characteristics as the development of an absolutely continuous path in the extended tensor algebra [F.-H.-Tapia, Forum of Mathematics: Sigma (2022), "Unified signature cumulants and generalized Magnus expansions"], we extend the arguments developed for smooth rough paths in [Lemercier-Lyons-Salvi, "Log-PDE Methods for Rough Signature Kernels"] to derive a PDE system for the expected signature of inhomogeneous Lévy processes. As a specific example, we see that the expected signature kernel of Gaussian martingales satisfies a Goursat PDE.
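For deterministic bounded-variation paths, the (non-expected) signature kernel itself solves the Goursat PDE u_{st} = ⟨x'(s), y'(t)⟩ u with unit boundary data, which admits a simple finite-difference scheme. The sketch below is a first-order solver in that spirit, not the paper's PDE system for Lévy processes.

```python
import numpy as np

def sig_kernel(x, y):
    """First-order Goursat scheme for u_{st} = <x'(s), y'(t)> u, u(0,.) = u(.,0) = 1.

    x, y: arrays of shape (length, channels) sampling two paths.
    """
    dx, dy = np.diff(x, axis=0), np.diff(y, axis=0)
    inc = dx @ dy.T                       # pairwise increment inner products
    u = np.ones((len(dx) + 1, len(dy) + 1))
    for i in range(len(dx)):
        for j in range(len(dy)):
            u[i + 1, j + 1] = u[i + 1, j] + u[i, j + 1] + (inc[i, j] - 1.0) * u[i, j]
    return u[-1, -1]

# Sanity check: for the 1-d linear path x(t) = t on [0, 1], the signature kernel
# equals sum_n 1/(n!)^2 = I_0(2) ~ 2.2796.
t = np.linspace(0.0, 1.0, 101)[:, None]
k = sig_kernel(t, t)
```

The Gaussian-martingale result in the abstract says the *expected* signature kernel satisfies a Goursat PDE of the same shape, so a solver of this form carries over with the appropriate driving term.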



Manifold-regularised Large-Margin $\ell_p$-SVDD for Multidimensional Time Series Anomaly Detection

Arashloo, Shervin Rahimzadeh

arXiv.org Artificial Intelligence

We generalise the recently introduced large-margin $\ell_p$-SVDD approach to exploit the geometry of the data distribution via manifold regularisation for time series anomaly detection. Specifically, we formulate a manifold-regularised variant of the $\ell_p$-SVDD method that encourages label smoothness on the underlying manifold, capturing structural information for improved detection performance. Drawing on an existing Representer theorem, we then provide an effective optimisation technique for the proposed method. We theoretically study the proposed approach using Rademacher complexities to analyse its generalisation performance and also provide an experimental assessment of the proposed method across various data sets to compare its performance against other methods.
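For orientation, the basic kernel one-class idea that $\ell_p$-SVDD refines can be sketched as scoring points by their distance to the training mean in RBF feature space. This omits the margin, the $\ell_p$ norm, and the manifold regulariser that are the paper's contributions; it is only the underlying baseline.

```python
import numpy as np

def svdd_style_scores(X_train, X_test, gamma=1.0):
    """Anomaly score as squared distance to the mean in RBF feature space
    (baseline one-class idea only; no margin or manifold term)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K_tr = k(X_train, X_train)
    # ||phi(z) - mean_i phi(x_i)||^2 = k(z,z) - 2 mean_i k(z,x_i) + mean_ij K_tr
    return 1.0 - 2.0 * k(X_test, X_train).mean(axis=1) + K_tr.mean()

rng = np.random.default_rng(2)
X_train = rng.normal(size=(100, 4))      # nominal data
inlier = np.zeros((1, 4))
outlier = 10.0 * np.ones((1, 4))
s_in = svdd_style_scores(X_train, inlier)
s_out = svdd_style_scores(X_train, outlier)
```

SVDD proper learns a minimal enclosing ball with slack variables rather than using the raw feature-space mean, which is what the Representer-theorem-based optimisation in the paper addresses.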


Kernel Learning for Mean-Variance Trading Strategies

Futter, Owen, Cirone, Nicola Muca, Horvath, Blanka

arXiv.org Artificial Intelligence

Trading strategy construction and portfolio choice is a fundamental problem in quantitative finance, where traders and researchers aim to balance maximising PnL against the constraints they face, such as the information available to them, volatility, liquidity, and trading costs. In this work, we are concerned with finding optimal dynamic, path-dependent trading strategies in which the past trajectory of information is incorporated into the decision making as new information filters in. In particular, we solve an optimal portfolio choice problem where the inventory (position) is given as the control variate and derive a solution to the mean-variance criterion. Path-dependencies are ubiquitous in finance. They are present both in the underlying asset time series, either arising due to path-dependent volatility [Guy14; GL22] or due to price discovery (stochastic drift) as market participants react to diverse sources of information that arrives at differing speeds [Tót+11; Con+14; Oha15].
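The classical single-period analogue of the mean-variance criterion solved here (in a path-dependent setting) maximises w·μ − (λ/2) wᵀΣw, giving w* = (1/λ) Σ⁻¹μ. The numbers below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Classical static mean-variance:  max_w  w.mu - (lam/2) w' Sigma w
# => w* = (1/lam) Sigma^{-1} mu.   (Single-period analogue of the paper's
# path-dependent problem; mu, Sigma, and lam are made-up illustrative values.)
mu = np.array([0.05, 0.02, 0.03])            # expected returns (assumed)
Sigma = np.array([[0.040, 0.006, 0.004],
                  [0.006, 0.090, 0.010],
                  [0.004, 0.010, 0.060]])    # return covariance (assumed)
lam = 2.0                                    # risk aversion (assumed)
w = np.linalg.solve(lam * Sigma, mu)         # optimal static weights
```

The paper's contribution is to replace the static μ and Σ with functionals of the past information path, so that the optimal position reacts to the trajectory, not just the current state.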